West Germany — synonyms, definition

1. West Germany (Noun)

1 synonym
Federal Republic of Germany
1 definition

West Germany (Noun) — A former republic in north-central Europe on the North Sea; established in 1949 from the zones of Germany occupied by the British, French, and American forces after the German defeat; reunified with East Germany in 1990.

2 types of
European country
European nation